
    How Haptic Size Sensations Improve Distance Perception

    Determining distances to objects is one of the most ubiquitous perceptual tasks in everyday life. Nevertheless, it is challenging because the information in a single image confounds object size and distance. Though our brains frequently judge distances accurately, the underlying computations are not well understood. Our work illuminates these computations by formulating a family of probabilistic models that encompass a variety of distinct hypotheses about distance and size perception. We compare these models' predictions to a set of human distance judgments in an interception experiment and use Bayesian analysis tools to quantitatively select the best hypothesis on the basis of its explanatory power and robustness over the experimental data. The central question is whether, and how, human distance perception incorporates size cues to improve accuracy. We conclude that: 1) humans incorporate haptic object size sensations for distance perception; 2) the incorporation of haptic sensations is suboptimal given their reliability; 3) humans use environmentally accurate size and distance priors; and 4) distance judgments are produced by perceptual “posterior sampling”. In addition, we compared our model's estimated sensory and motor noise parameters with previously reported measurements in the perceptual literature and found good correspondence between them. Taken together, these results represent a major step forward in establishing the computational underpinnings of human distance perception and the role of size information.

    Funding: National Institutes of Health (U.S.) (NIH grant R01EY015261); University of Minnesota (UMN Graduate School Fellowship); National Science Foundation (U.S.) (Graduate Research Fellowship); University of Minnesota (UMN Doctoral Dissertation Fellowship); National Institutes of Health (U.S.) (NIH NRSA grant F32EY019228-02); Ruth L. Kirschstein National Research Service Award.
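    The “posterior sampling” idea in this abstract can be illustrated with a minimal sketch. This is a toy illustration under simplifying assumptions (Gaussian cues and prior over distance; the function names are hypothetical, not the authors' code): the observer combines cues by precision weighting, then responds with a random draw from the posterior rather than with its mean, so repeated trials show posterior-shaped variability.

```python
import random

def gaussian_posterior(cues, prior):
    """Combine Gaussian cues and a Gaussian prior, each given as a
    (mean, sigma) pair, by precision-weighted averaging.
    Returns the posterior (mean, sigma)."""
    precisions = [1.0 / s**2 for _, s in cues] + [1.0 / prior[1]**2]
    means = [m for m, _ in cues] + [prior[0]]
    total = sum(precisions)
    post_mean = sum(p * m for p, m in zip(precisions, means)) / total
    return post_mean, (1.0 / total) ** 0.5

def respond(cues, prior, rng):
    """Posterior sampling: the response is a single draw from the
    posterior, not its mean."""
    mean, sigma = gaussian_posterior(cues, prior)
    return rng.gauss(mean, sigma)
```

    For example, a visual distance cue and a haptic size cue would each contribute one (mean, sigma) entry in `cues`, with sigma reflecting that cue's reliability.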

    On the Origins of Suboptimality in Human Probabilistic Inference

    Humans have been shown to combine noisy sensory information with previous experience (priors), in qualitative and sometimes quantitative agreement with the statistically optimal predictions of Bayesian integration. However, when the prior distribution becomes more complex than a simple Gaussian, such as skewed or bimodal, training takes much longer and performance appears suboptimal. It is unclear whether such suboptimality arises from an imprecise internal representation of the complex prior, or from additional constraints in performing probabilistic computations on complex distributions, even when accurately represented. Here we probe the sources of suboptimality in probabilistic inference using a novel estimation task in which subjects are exposed to an explicitly provided distribution, thereby removing the need to remember the prior. Subjects had to estimate the location of a target given a noisy cue and a visual representation of the prior probability density over locations, which changed on each trial. Different classes of priors were examined (Gaussian, unimodal, bimodal). Subjects' performance was in qualitative agreement with the predictions of Bayesian Decision Theory, although generally suboptimal. The degree of suboptimality was modulated by statistical features of the priors but was largely independent of the class of the prior and the level of noise in the cue, suggesting that suboptimality in dealing with complex statistical features, such as bimodality, may be due to a problem of acquiring the priors rather than computing with them. We performed a factorial model comparison across a large set of Bayesian observer models to identify additional sources of noise and suboptimality. Our analysis rejects several models of stochastic behavior, including probability matching and sample-averaging strategies. Instead, we show that subjects' response variability was mainly driven by a combination of a noisy estimation of the parameters of the priors and variability in the decision process, which we represent as a noisy or stochastic posterior.
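    The sample-averaging strategy mentioned in this abstract can be sketched in a few lines. This is a toy illustration, not the authors' model, and `sample_average_response` is a hypothetical name: the response is the mean of k independent posterior draws, so k = 1 reproduces probability matching (full posterior variability), while large k approaches the deterministic posterior-mean response, shrinking response variance roughly as sigma²/k.

```python
import random
import statistics

def sample_average_response(draw_posterior, k, rng):
    """Respond with the mean of k independent posterior draws.
    draw_posterior(rng) returns one sample from the posterior;
    k = 1 gives probability matching, large k approximates the
    posterior mean."""
    return statistics.fmean(draw_posterior(rng) for _ in range(k))
```

    Factorial model comparison can then ask which k (and which added noise sources) best explains the observed trial-to-trial response variance.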

    Abstract

    Perceptual bistability refers to the phenomenon of spontaneously switching between two or more interpretations of an image under continuous viewing. Although switching behavior is increasingly well characterized, its origins remain elusive. We propose that perceptual switching naturally arises from the brain's search for the best interpretation while performing Bayesian inference. In particular, we propose that the brain explores a posterior distribution over image interpretations at a rapid time scale via a sampling-like process and updates its interpretation when a sampled interpretation is better than the discounted value of its current interpretation. We formalize the theory, explicitly derive switching-rate distributions, and discuss qualitative properties of the theory, including the effect of changes in the posterior distribution on switching rates. Finally, predictions of the theory are shown to be consistent with measured changes in human switching dynamics to Necker cube stimuli induced by context.
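    The proposed switching rule lends itself to a rough discrete-time sketch. This is a toy simulation under assumed dynamics, not the authors' derivation, and `simulate_switches` with its parameters is hypothetical: the value of the current interpretation decays by a discount factor each step, a candidate interpretation is sampled in proportion to its posterior, and a switch occurs when the candidate's posterior exceeds the current discounted value.

```python
import random

def simulate_switches(post, gamma, steps, rng):
    """Toy discrete-time sketch of sampling-driven perceptual switching.
    post: dict mapping interpretation -> posterior probability.
    gamma: per-step discount applied to the current percept's value."""
    current = max(post, key=post.get)   # start at the MAP interpretation
    value = post[current]
    switches = 0
    for _ in range(steps):
        value *= gamma                  # current interpretation's value decays
        names = list(post)
        candidate = rng.choices(names, weights=[post[n] for n in names])[0]
        if candidate == current:
            value = post[current]       # re-sampling the percept refreshes it
        elif post[candidate] > value:
            current, value = candidate, post[candidate]
            switches += 1
    return switches
```

    In this sketch, sharpening the posterior around one interpretation lowers the switching rate, qualitatively matching the context effects the theory is compared against.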

    Influence of retinal image shifts and extra-retinal eye movement signals on binocular rivalry alternations

    Previous studies have indicated that saccadic eye movements correlate positively with perceptual alternations in binocular rivalry, presumably because the foveal image changes resulting from saccades, rather than the eye movements themselves, cause switches in awareness. Recently, however, we found evidence that retinal image shifts elicit so-called onset rivalry and not percept switches as such. These findings raise the interesting question of whether onset rivalry may account for correlations between saccades and percept switches. We therefore studied binocular rivalry when subjects made eye movements across a visual stimulus and compared it with rivalry in a 'replay' condition in which subjects maintained fixation while the same retinal displacements were reproduced by stimulus displacements on the screen. We used dichoptic random-dot motion stimuli viewed through a stereoscope, and measured eye and eyelid movements with scleral search coils. Positive correlations between retinal image shifts and perceptual switches were observed for both saccades and stimulus jumps, but only for switches towards the subjects' preferred eye at stimulus onset. A similar asymmetry was observed for blink-induced stimulus interruptions. Moreover, for saccades, amplitude appeared crucial: the positive correlation persisted for small stimulus jumps, but not for small saccades (amplitudes < 1 degree). These findings corroborate our tenet that saccades elicit a form of onset rivalry, and that rivalry is modulated by extra-retinal eye movement signals.